Deteriorating Convergence for Asynchronous Methods on Linear Least Squares Problems
Authors
Abstract
A block iterative method is used for solving linear least squares problems. The subproblems are solved asynchronously on a distributed memory multiprocessor. It is observed that an increased number of processors results in a deteriorating rate of convergence. This deteriorating convergence is illustrated by numerical experiments. The deterioration of the convergence can be explained by contamination of the residual. Our purpose is to show that the residual is contaminated by old information. The issues investigated here are the effect of the number of processors, the role of essential neighbors, and synchronization. The characterization of old information remains an open problem.
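To make the phenomenon concrete, here is a minimal sketch (our illustration, not the paper's algorithm; the block partition, the damping factor `omega`, and the way staleness is modeled are all assumptions): a damped block-Jacobi least-squares iteration in which every column block is corrected against a residual that is `delay` sweeps old. Increasing the delay mimics contamination of the residual by old information.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 200, 40, 4                      # rows, columns, column blocks
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
blocks = np.array_split(np.arange(n), p)  # one block per "processor"

def block_sweeps(A, b, blocks, delay=0, omega=0.2, sweeps=100):
    """Damped block-Jacobi sweeps for min ||Ax - b||; each sweep corrects
    every column block against a residual that is `delay` sweeps old."""
    x = np.zeros(A.shape[1])
    history = [b - A @ x]                 # residuals from past sweeps
    for _ in range(sweeps):
        r_old = history[max(0, len(history) - 1 - delay)]  # stale residual
        for J in blocks:
            d, *_ = np.linalg.lstsq(A[:, J], r_old, rcond=None)
            x[J] += omega * d             # damped exact block correction
        history.append(b - A @ x)
    return np.linalg.norm(A.T @ history[-1])  # normal-equations residual

for delay in (0, 2, 4):
    print(f"delay={delay}: final ||A^T r|| = "
          f"{block_sweeps(A, b, blocks, delay):.3e}")
```

With `delay=0` the residual is current and the normal-equations residual decays steadily; with `delay=4` the same iteration works from stale residuals and converges markedly more slowly, mirroring the deterioration reported above.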
Similar Articles
Asynchronous Methods and Least Squares: An Example of Deteriorating Convergence
We use a block iterative method for solving linear least squares problems. The subproblems are solved asynchronously on a distributed memory multiprocessor. It is observed that an increased number of processors results in deteriorating convergence. We illustrate the deteriorating convergence with some numerical experiments. The deterioration of the convergence can be explained by contamination of th...
The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM
We introduce the Stochastic Asynchronous Proximal Alternating Linearized Minimization (SAPALM) method, a block coordinate stochastic proximal-gradient method for solving nonconvex, nonsmooth optimization problems. SAPALM is the first asynchronous parallel optimization method that provably converges on a large class of nonconvex, nonsmooth problems. We prove that SAPALM matches the best known ra...
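As a point of reference, the following is a hedged sketch of the generic building block such methods rest on (not SAPALM itself; the function names and the l1-regularizer choice are our assumptions): a proximal-gradient update of one coordinate block, where in an asynchronous setting the gradient may have been computed from a stale iterate.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: shrink each entry toward zero by t (nonsmooth step)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_block_step(x, J, grad_J, eta, lam):
    """One block update x_J <- prox_{eta*lam*||.||_1}(x_J - eta*grad_J).
    In an asynchronous method, grad_J may come from a stale copy of x."""
    x[J] = soft_threshold(x[J] - eta * grad_J, eta * lam)

# Usage: update block [0, 1, 2] with a (hypothetical) stochastic gradient.
x = np.ones(6)
prox_grad_block_step(x, np.arange(3), grad_J=np.array([0.5, -0.2, 0.9]),
                     eta=0.1, lam=1.0)
print(x)
```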
Asynchronous Stochastic Proximal Methods for Nonconvex Nonsmooth Optimization
We study stochastic algorithms for solving non-convex optimization problems with a convex yet possibly non-smooth regularizer, which find wide application in practical machine learning. However, compared to asynchronous parallel stochastic gradient descent (AsynSGD), an algorithm targeting smooth optimization, the understanding of the behavior of stochastic algorithms for the n...
Harmonics Estimation in Power Systems using a Fast Hybrid Algorithm
In this paper a novel hybrid algorithm for harmonics estimation in power systems is proposed. The estimation of the harmonic components is a nonlinear problem due to the nonlinearity of the phase of the sinusoids in distorted waveforms. Most researchers have implemented nonlinear methods to extract the harmonic parameters. However, nonlinear methods for amplitude estimation increase the convergence time. Hen...
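Connecting this to the least-squares theme above: a common ingredient of such hybrid schemes is that, once the harmonic frequencies are fixed, rewriting each term A_k cos(kωt + φ_k) as a_k cos(kωt) + b_k sin(kωt) makes amplitude and phase estimation a linear least-squares problem. A minimal sketch under that assumption (the signal, sample rate, and noise level are ours, not the paper's):

```python
import numpy as np

f0, fs, N = 50.0, 3200.0, 256            # fundamental (Hz), sample rate, samples
t = np.arange(N) / fs
harmonics = [1, 3, 5]                    # harmonic orders assumed present

# Synthetic distorted waveform with additive noise.
rng = np.random.default_rng(1)
true_amps, true_phis = [1.0, 0.3, 0.1], [0.2, -1.0, 0.5]
y = sum(amp * np.cos(2 * np.pi * k * f0 * t + ph)
        for amp, ph, k in zip(true_amps, true_phis, harmonics))
y += 0.01 * rng.standard_normal(N)

# Design matrix with [cos, sin] columns per harmonic; solve by least squares.
cols = []
for k in harmonics:
    cols += [np.cos(2 * np.pi * k * f0 * t), np.sin(2 * np.pi * k * f0 * t)]
coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
a, b = coef[0::2], coef[1::2]            # a = A*cos(phi), b = -A*sin(phi)
print("amplitudes:", np.round(np.hypot(a, b), 3))   # ~ [1.0, 0.3, 0.1]
print("phases:    ", np.round(np.arctan2(-b, a), 3))  # ~ [0.2, -1.0, 0.5]
```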
Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server
This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. The algorithm can handle both general convex (possibly non-smooth) regularizers and general convex constraints. When the empirical data loss is strongly convex, we establish a linear convergence rate and give explicit expr...
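A minimal serial sketch of the incremental aggregated gradient idea behind such parameter-server methods (an assumed textbook form, not the paper's implementation; the step size and cyclic block order are our choices): keep the most recent gradient from each data block and step with their sum, refreshing one block per iteration.

```python
import numpy as np

def iag(grads, x0, eta=0.002, iters=3000):
    """Step with the sum of the most recent gradient from each block,
    refreshing one block's (possibly stale) gradient per iteration."""
    n = len(grads)
    table = [g(x0) for g in grads]        # last gradient seen from each block
    agg = np.sum(table, axis=0)
    x = x0.copy()
    for k in range(iters):
        i = k % n                         # cyclic block selection
        g_new = grads[i](x)
        agg += g_new - table[i]           # swap in block i's fresh gradient
        table[i] = g_new
        x -= eta * agg
    return x

# Usage: a strongly convex quadratic loss split into 4 data blocks.
rng = np.random.default_rng(0)
A = rng.standard_normal((120, 10)); b = rng.standard_normal(120)
grads = [lambda x, Ai=Ai, bi=bi: Ai.T @ (Ai @ x - bi)
         for Ai, bi in zip(np.array_split(A, 4), np.array_split(b, 4))]
x_hat = iag(grads, np.zeros(10))
print(np.linalg.norm(A.T @ (A @ x_hat - b)))  # near zero at the LS solution
```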